Evolution, Medicine, and Public Health
Oxford University Press (OUP)
All preprints, ranked by how well they match Evolution, Medicine, and Public Health's content profile, based on 14 papers previously published here. The average preprint has a 0.01% match score for this journal, so anything above that is already an above-average fit. Older preprints may already have been published elsewhere.
Lange, E. C.; Zeng, S.; Campos, F. A.; Li, F.; Tung, J.; Archie, E. A.; Alberts, S. C.
Does social isolation in adulthood predict survival because socially isolated individuals are already unhealthy due to adversity earlier in life ("health selection")? Or do adult social environments directly cause poor health and increased mortality risk ("social causation")? These alternative hypotheses are difficult to disentangle in humans because prospective data on survival and the environment for both early life and adulthood are rarely available. Using data from the baboon population of Amboseli, Kenya, a model for human behavior and aging, we show that early adversity and adult social isolation contribute independently to reduced adult survival, in support of both health selection and social causation. Further, strong social bonds and high social status can buffer some negative effects of early adversity on survival. These results support a growing change in perspective, away from "either-or" hypotheses and towards a multi-causal perspective that points to multiple opportunities to mitigate the effects of social adversity. Teaser: Early life environments and adult social bonds have strong, but largely independent, effects on survival in wild baboons.
Aldakak, L.; Rühli, F.; Bender, N.
Sex differences in immunity have been described in humans and other mammal species. Females have a lower incidence of infections and non-reproductive malignancies and exhibit higher antibody levels after vaccination. Existing evolutionary explanations are based on differences in reproductive strategies and reactions to extrinsic differences in susceptibility and virulence between the sexes. Here, we test the hypothesis that known differences in the probability of transmission and outcome of sexually transmitted infections contribute to sex differences in immunocompetence. We modelled reproductive and immune investments against a fertility-limiting sexually transmitted infection (STI). We show that, in line with previous findings, increased susceptibility selects for tolerance to the parasite while increased virulence selects for resistance against it. Differences in reproductive strategies between the sexes lead to sex differences in immunocompetence, mostly with higher competence in females. Extrinsic differences in susceptibility and virulence between the sexes can augment or alleviate the evolutionary consequences of intrinsic differences, depending on their direction and magnitude. This indicates that the selection of sex-specific immune strategies is less predictable than previously thought, and explains why sex differences in immunity have been found not to be universal and pervasive across animal species.
Kapsetaki, S. E.; Compton, Z.; Rupp, S. M.; Garner, M. M.; Duke, E. G.; Boddy, A. M.; Harrison, T. M.; Aktipis, A.; Maley, C. C.
The ecology in which species live and evolve likely affects their health and vulnerability to diseases, including cancer. Using 14,267 necropsy records across 244 vertebrate species, we tested whether animals in low-productivity habitats, with large habitat ranges, with high metabolic rates (inferred from body temperature and weight), and at high trophic levels (from lowest to highest: herbivores, invertivores, primary carnivores, and secondary carnivores) have an increased prevalence of neoplasia. This study found that: (1) habitat productivity negatively correlated with the prevalence of malignancy and neoplasia across tissues, and with malignancy and neoplasia in gastrointestinal tissues; (2) inferred metabolic rates negatively correlated with the prevalence of neoplasia; and (3) trophic levels positively correlated with malignancy and neoplasia prevalence in both mammals and non-mammals. However, only the correlations with trophic levels remained significant after Bonferroni corrections for multiple testing. There are several mechanisms that might explain these findings, including the biomagnification of carcinogens at higher trophic levels, as well as trade-offs between cancer suppression and reproduction and survival in low-productivity environments.
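The Bonferroni adjustment mentioned above simply divides the significance threshold by the number of tests. A minimal sketch with made-up p-values (not the study's actual statistics):

```python
# Bonferroni correction: with n tests, compare each p-value to alpha / n.
# These p-values are made up for illustration; they are not from the
# necropsy dataset described in the abstract.
alpha = 0.05
p_values = {
    "habitat_productivity": 0.021,
    "metabolic_rate": 0.034,
    "trophic_level": 0.0004,
}
threshold = alpha / len(p_values)        # 0.05 / 3 ~= 0.0167
significant = {name: p < threshold for name, p in p_values.items()}
```

In this toy setup only the trophic-level test clears the corrected threshold, mirroring the pattern reported above.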
Piscitelli, A. P.; Messina, S.; Wauters, L. A.; Santicchia, F.; Matthysen, E.; Leirs, H.; Vanden Broecke, B.
Animal personality and parasite infections are key forces shaping the ecology and evolution of natural populations. Personality traits--such as activity, exploration, and boldness--shape how individuals interact with their environment and conspecifics, influencing both their exposure and susceptibility to parasite infection. In turn, parasites can impact host fitness and energy allocation, and may modify host behaviour either through manipulations to enhance transmission or as consequences of energetic trade-offs associated with mounting an immune response. Despite growing interest in the interplay between behaviour and infection, the overall directionality and consistency of personality-parasite relationships remain unclear. This relationship is further modulated by ecological and biological factors, such as parasite type (e.g. micro-, ecto-, or endoparasites) and host type (e.g. intermediate versus definitive), which can influence both infection risk and the nature of behavioural responses. To disentangle these effects, we performed a meta-analysis of 226 effect sizes across 80 studies, assessing (i) the impact of experimental infections on host personality traits, and (ii) the correlation between personality and infection status in observational studies of wild populations--while accounting for variation in parasite groups and host roles. In experimental studies, infected hosts exhibited significantly reduced levels of activity and exploration, while effects on boldness and aggressiveness were non-significant. These findings suggest that infection imposes energetic costs that suppress behaviours requiring sustained effort, such as movement and exploration. Conversely, observational studies showed a positive association between activity-exploration and infection probability, likely reflecting greater exposure of more active individuals to parasites via increased interaction with conspecifics or contaminated environments. 
Meta-regression analyses further revealed that parasite type and host role modulate personality-infection dynamics. In experimental studies, microparasites were associated with reduced boldness and activity-exploration, while endoparasites led to reduced activity-exploration--particularly in intermediate hosts. Notably, hosts showed significant behavioural suppression in experimental contexts, but not in observational studies, potentially indicating that behaviourally tolerant individuals are favoured in natural environments where personality traits relate directly to fitness. Together, these findings underscore the importance of ecological context and study design in interpreting personality-parasite associations. Experimental infections tend to reveal the physiological costs of infection, while observational studies highlight behavioural traits that modulate infection risk. By integrating data across host types, parasite groups, and methodological approaches, our meta-analysis provides a more comprehensive understanding of how personality and infection interact. These insights contribute to a broader effort to link behavioural ecology with disease ecology, clarifying how individual variation in behaviour shapes--and is shaped by--host-parasite dynamics.
Valenta, K.; Grebe, N.; Kelly, T.; Applebaum, J. W.; Stern, A.; Traff, J.; Satishchandran, S.; Rosenbaum, S.; Lantigua, V.; Lee, A. C. Y.
Parasitism is one of the key structural interspecific interactions in ecology. One remarkable parasitic strategy that has been documented in multiple systems is the behavioral manipulation of hosts to increase parasite fitness. While not yet documented in humans, we propose that a ubiquitous zoonotic parasite - Toxoplasma gondii - may change human behavior to favor the parasite by increasing the fitness of the parasite's definitive host - cats. Specifically, we assess the possibility that human behavioral changes resulting from chronic, latent T. gondii infection lead to measurable changes in attitudes, actions and dopaminergic responses towards cats that function to increase domestic cat fitness. We assessed the potential role of humans in the T. gondii lifecycle by identifying and testing behavioral changes in humans that benefit the parasite; specifically, human affection for cats. We assessed T. gondii infection status in 68 participants using T. gondii serum antibody testing, and assessed their attitudes towards cats in three ways: i) surveys, ii) participant behavior in the presence of domestic cats, and iii) participant oxytocin levels before and after interactions with cats to assess dopaminergic changes. Only 2 of 68 participants were positive for T. gondii antibodies, limiting statistical power. However, our results indicated that T. gondii-positive participants both reported a greater affection for cats in surveys and spent more time engaged with cats during behavioral trials than T. gondii-negative participants (87% of study time engaging with cats vs 75%). Oxytocin results were inconclusive.
Farahani, H. K.; Lacombrade, M.; Perez, G. M.; Monchanin, C.; Lihoreau, M.
Aging induces cognitive decline in humans and some other animals. For species that rely on learning and memory for reproduction, impaired cognitive functions may incur severe fitness costs. Here we report age-related cognitive decline in a solitary parasitoid wasp, Venturia canescens, that uses olfactory memories for host seeking and selection. We trained individual wasps to associate an odour with an oviposition reward, and compared their learning and memory performances at different stages of reproductive life. Wasps between 6 and 14 days old showed consistently poorer learning and reduced memory retention than younger conspecifics, and this tendency increased with age. In this parasitoid insect, aging induces a precocious cognitive decline in reproductive females, which could severely impact their fitness through altered abilities to identify high-quality hosts.
Binsted, L. E.; McNally, L.
Antimicrobial resistance (AMR) poses an urgent public health challenge. To improve patient outcomes and design interventions, we must identify patient characteristics which predict the presence of AMR pathogens. One potential and commonly collected patient characteristic is host age, yet consensus remains elusive regarding its impact on the probability of infecting pathogens being resistant to antimicrobials. Here, we employ a meta-analysis to consolidate and compare previous studies and examine the relationship between antibiotic resistance and host age across bacteria and antibiotics. We show that although the probability that infecting bacteria are antimicrobial resistant increases with host age on average, diverse patterns exist across antibiotic classes and bacterial genera, including negative, humped, and U-shaped relationships. We further illustrate, using a compartmental epidemiological model, that this variation is likely driven by differences in antibiotic consumption or incidence of bacterial infection/carriage between age groups, combined with age-assortative transmission. These findings imply that empirical antibiotic therapy could be improved by considering age-specific local resistance levels (compared with overall local resistance levels), resulting in improved treatment success and reduced spread of antibiotic resistance. They additionally illustrate the consequences of assuming population homogeneity in epidemiological models. Finally, they indicate that the landscape of the already severe resistance crisis is likely to change as the age distribution of the human population shifts.
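The proposed mechanism, age-assortative transmission combined with age-specific antibiotic consumption, can be caricatured in a few lines. This is a hypothetical two-group sketch with illustrative parameters, not the authors' model:

```python
# Minimal two-age-group SIS-style sketch of resistant-strain carriage with
# age-assortative mixing. All parameter values are hypothetical, chosen only
# to illustrate the mechanism; the paper's actual model differs.
consumption = [0.1, 0.3]            # antibiotic use: young, old
mixing = [[0.8, 0.2],               # age-assortative contact matrix
          [0.2, 0.8]]               # (rows: susceptible group, cols: source)
beta, gamma, dt = 0.5, 0.2, 0.1
I = [0.01, 0.01]                    # initial resistant carriage per group

for _ in range(5000):
    force = [beta * sum(mixing[i][j] * I[j] for j in range(2))
             for i in range(2)]
    # higher antibiotic consumption gives resistant strains a within-group edge
    I = [min(1.0, max(0.0, I[i] + dt * (force[i] * (1 - I[i]) * (1 + consumption[i])
                                        - gamma * I[i])))
         for i in range(2)]
```

At equilibrium the higher-consumption (older) group carries more resistance even though both groups share the same pathogen, echoing the age-specific local resistance levels the abstract argues should guide empirical therapy.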
Jacobs, N. T.; Weiser, J. N.
Pathogens experience selection at multiple scales, given the need to transmit between hosts and replicate within them. This presents the challenge of cross-scale selective conflict when adaptations to one scale compromise fitness at another, such as mutations that improve transmissibility but make individuals less competitive within hosts. Selection operates differently at these scales, with tight transmission bottlenecks subjecting pathogen populations to genetic drift, and large population sizes within hosts enabling efficient selection for beneficial mutations. Compounding the reduction in diversity by transmission bottlenecks is the occupant-intruder competitive strategy exhibited by some pathogens, where the first variant to colonize a host prevents later-arriving variants from contributing to infection, preventing immigration and turning transmission into a "founder takes all" contest. Here, we used multiple modeling approaches to examine how this behavior affects the efficiency of selection for both transmissibility and within-host fitness. We find that in the face of a trade-off, selection for transmissibility is maximized under a tight transmission bottleneck that minimizes within-host competition during colonization. While mutations with increased within-host fitness are favored during within-host replication, an occupant-intruder strategy prevents these mutants from displacing established residents and propagating across the host population, leading to their extinction if they are insufficiently transmissible. Finally, a model of competition on the scale of the host population revealed that competitive exclusion limits the propagation of mutations with improved within-host fitness, unless resident populations can incorporate alleles from intruding variants by recombination.
Thus, competitive exclusion may facilitate the improvement and maintenance of pathogen transmissibility, with directional recombination allowing resident populations to mitigate the potential loss of within-host fitness imposed by this occupant-intruder strategy. Author Summary: Transmission is a defining feature of infectious diseases, and so a better understanding of how transmissibility evolves is important for improving disease surveillance and prevention. Successful transmission is often achieved by a small number of individuals which, after establishing residency in a host, may prevent newcomers from participating in infection. Here, we use modeling to examine how competitive exclusion of challengers by resident populations affects the balance between within-host competitive ability and transmissibility. We find that competitive exclusion strengthens selection for transmissibility by disproportionately benefitting the first variant to colonize a host and preventing mutants that may be more competitive but less transmissible from displacing established residents. However, competitive exclusion also limits the propagation of mutants that improve within-host fitness without reducing transmissibility, increasing the advantage of recombination that allows resident populations to acquire beneficial alleles from challengers. Competitive strategies that allow pathogens to "claim ownership" of hosts may thus help pathogen populations maintain transmissibility, with genetic recombination facilitating within-host adaptation through the incorporation of beneficial alleles from challengers.
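The "founder takes all" logic can be caricatured in a few lines: under strict competitive exclusion, between-host dynamics reduce to a recursion on transmissibility alone. This is a hypothetical sketch, not one of the paper's models:

```python
# Toy host-level dynamics under strict competitive exclusion ("founder takes
# all"): the first variant to colonize a host keeps it, so spread between
# hosts depends only on transmissibility. Variant labels and rates are
# hypothetical, not from the paper's models.
def step(p, t_mutant=1.1, t_resident=1.0):
    # p: frequency of the more-transmissible mutant among colonized hosts;
    # each newly founded host goes to a variant in proportion to its
    # transmissibility-weighted frequency.
    return p * t_mutant / (p * t_mutant + (1 - p) * t_resident)

p = 0.01                      # mutant starts rare
for _ in range(500):
    p = step(p)
```

The 10% transmissibility edge carries the mutant to fixation regardless of within-host fitness, which under competitive exclusion never enters the between-host recursion.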
Mouchka, M. E.; Dorsey, D. M.; Malcangio, G. L.; Medina, S. J.; Stuart, E. C.; Meyer, J. R.
The concept of evolvability (the capacity of populations to evolve) has deep historical roots in evolutionary biology. Interest in the subject has been renewed recently by innovations in microbiology that permit direct tests of the causes of evolvability, and with the acknowledgement that evolvability of pathogens has important implications for human health. Here, we investigate how fluctuating selection on the virus, Bacteriophage λ, affects its evolvability. We imposed dynamic selection by altering the expression of two host outer membrane receptors. This, in turn, selected phage to alternately infect the host via a single, or multiple, receptors. Our selection regime resulted in two orthogonal evolutionary behaviors, namely enhanced or reduced evolvability. Strains with enhanced evolvability readily evolved between receptors, losing and gaining the ability to bind multiple receptors more quickly than the ancestral λ. This suggests the receptor-binding protein retained a genetic memory of past states and that evolutionary history can be used to predict future adaptation. Strains with reduced evolvability were refractory to re-specialization and remained generalists on both receptors. Consistent with this behavior, unevolvable strains had reduced rates of molecular evolution in the receptor-binding protein compared to their evolvable counterparts. We found a single mutation in the receptor-binding protein was sufficient to render these strains resistant to evolution and did so by counteracting a receptor-binding trade-off associated with generalism. In this way, cost-free generalization allowed for reduced evolution and evolvability while maximizing success in both environments. Our results suggest the response to fluctuating selection is contingent and can lead to distinct differences in evolvability.
These findings contribute to a growing understanding of the causes and consequences of evolvability and have important implications for infectious disease management.
Gomez, L. M.; Meszaros, V. A.; Turner, W. C.; Ogbunugafor, C. B.
The relationship between parasite virulence and transmission is a pillar of evolutionary theory that has specific implications for public health. Part of this canon involves the idea that virulence and free-living survival (a key component of transmission) may have different relationships in different host-parasite systems. Most examinations of the evolution of virulence-transmission relationships--theoretical or empirical in nature--tend to focus on the evolution of virulence, with transmission a secondary consideration. And even within transmission studies, the focus on free-living survival is a smaller subset, though recent studies have examined its importance in the ecology of infectious diseases. Few studies have examined the epidemic-scale consequences of variation in survival across different virulence-survival relationships. In this study, we utilize a mathematical model motivated by aspects of SARS-CoV-2 natural history to investigate how evolutionary changes in survival may influence several aspects of disease dynamics at the epidemiological scale. Across virulence-survival relationships (where these traits are positively or negatively correlated), we found that small changes (5% above and below the nominal value) in survival can have a meaningful effect on certain outbreak features, including R0 and the size of the infectious peak in the population. These results highlight the importance of properly understanding the mechanistic relationship between virulence and parasite survival, as evolution of increased survival across different relationships with virulence will have considerably different epidemiological signatures.
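One way to see how a small survival perturbation propagates to R0 is to split R0 into direct and environmental transmission terms. The sketch below uses hypothetical parameter values and is not the authors' SARS-CoV-2-motivated model:

```python
# Back-of-envelope R0 for a pathogen with direct and environmental
# ("free-living") transmission routes. All parameter values are hypothetical,
# chosen only to illustrate the +/-5% survival perturbation in the abstract.
def r0(survival_hours, beta_direct=0.30, beta_env=0.02,
       shedding=0.10, recovery=1 / 7):
    decay = 1.0 / survival_hours                      # environmental decay rate
    direct = beta_direct / recovery                   # person-to-person term
    environmental = beta_env * shedding / (recovery * decay)
    return direct + environmental

nominal = 24.0                                        # hours of free-living survival
values = {pct: r0(nominal * (1 + pct / 100)) for pct in (-5, 0, 5)}
```

Only the environmental term scales with survival, so a 5% change in survival shifts R0 by less than 5% overall; how much less depends on the direct/environmental split, which is exactly the kind of mechanistic detail the authors argue matters.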
Hochberg, M. E.
Multicellular organisms are confronted not only with germline mutations, but also with mutations emerging in somatic cells. Somatic mutations can lead to various conditions, diseases, and cancers in particular. The somatic mutation rate is limited by evolved protection mechanisms, notably those repairing damaged DNA or eliminating mutated cells. However, in a broader context, life history traits such as body mass, age of first reproduction and reproductive lifespan can also be subject to selection due to the negative fitness impacts of disease. Here, I analyze a simple coevolutionary model of somatic mutation rate (SMR) and fitness lifespan (hereafter called fitspan), the latter measured as the age at which inclusive fitness becomes negligible. Evolution in the model is driven by the fitness costs of disease, because as organisms age: disease is more extensive, disease prevention mechanisms are less effective and more costly, and fitness payoffs of disease prevention are lower. I investigate relations between selective forces and (co)evolutionary responses, notably showing the possibility of either monotone or oscillatory non-equilibrium dynamics and fast or slow returns to equilibrium. I then compare model predictions to recently published data on body mass, lifespan and somatic mutation rate. I show that the model (1) can explain the non-linear empirical relationship between somatic mutation and lifespan, (2) predicts the evolution of longer lifespans through a heretofore ignored feedback loop, and (3) is consistent with the idea that the linear relation between somatic mutation accumulation and age is the net result of mutational washing-out. I argue that the findings here generalize to other decreases in condition with age that are subject to selection, including aging itself.
Azhir, A.; Cheng, J.; Tian, J.; Bassett, I. V.; Patel, C. J.; Klann, J. G.; Murphy, S. N.; Estiri, H.
Background: Older age is widely considered a risk factor for post-acute sequelae of SARS-CoV-2 infection (PASC), typically attributed to immunosenescence and inflammaging. However, whether this association reflects intrinsic biological ageing or accumulated comorbidity burden remains unclear, with implications for clinical risk stratification. Methods: We conducted a retrospective cohort study using the Precision PASC Research Cohort (P2RC) from Mass General Brigham, comprising 133,792 COVID-19 patients from 12 hospitals and 20 community health centres in Massachusetts (March 2020-May 2024). PASC was ascertained using a validated computational phenotyping algorithm. We used generalised estimating equations with cluster-robust variance to model PASC risk, causal mediation analysis to decompose age effects through comorbidity burden and acute severity, and specification curve analysis across 768 analytical specifications to assess robustness. Findings: After adjustment for comorbidity burden, each decade of age was associated with 6% lower odds of PASC (OR 0.94; 95% CI 0.93-0.95). Causal mediation analysis revealed that comorbidities accounted for 145% of the total age effect, indicating inconsistent mediation wherein age's direct protective effect was masked by its indirect harm through chronic disease accumulation. This protection was age-dependent: adults younger than 65 years retained robust resilience independent of comorbidities (ADE: -0.0042, p<0.001), whereas adults 65 years and older showed complete loss of this protection (ADE: +0.0020, p=0.14). Interpretation: Long COVID susceptibility is driven by physiological reserve rather than chronological age until approximately age 65, beyond which age-related protective mechanisms become exhausted. Risk stratification should prioritise comorbidity burden over birth year in younger adults. Funding: National Institute of Allergy and Infectious Diseases (NIAID).
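The 145% figure is worth unpacking: when the mediated (indirect) effect and the direct effect have opposite signs, the "proportion mediated" exceeds 100%, which is what "inconsistent mediation" means. A minimal numeric sketch with illustrative values, not the study's estimates:

```python
# Decomposing a total effect into direct (ADE) and mediated (ACME) parts.
# The numbers below are illustrative only (not the study's estimates),
# chosen so the indirect harm via comorbidities masks a direct protection.
ade = -0.0045    # average direct effect of a decade of age (protective)
acme = 0.0145    # indirect effect via comorbidity burden (harmful)
total = ade + acme            # apparent "age increases PASC risk" effect

# Proportion mediated > 100% signals inconsistent mediation: the direct
# and indirect effects point in opposite directions.
proportion_mediated = acme / total
```

Here proportion_mediated works out to 1.45, i.e. the 145%-style pattern the study reports: the positive total effect is entirely an artifact of the mediated pathway.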
Mejia-Guevara, I.; Gazeley, U.; Nabukalu, D.; Aburto, J. M.
Despite recent progress, Sub-Saharan Africa (SSA) continues to have the lowest life expectancy and poorest health outcomes globally. While monitoring life expectancy trends is essential, a comprehensive understanding of mortality reduction requires examining age-specific death distributions. Using data from the 2024 Revision of the UN World Population Prospects, we analyze trends in life expectancy and lifespan variation across 49 SSA countries from 1950 to 2023, with projections to 2050, to assess the impact of mortality changes at young, adult, and old ages. We examine three periods--1950-1979, 1980-2004, and 2005-2023--selected based on key epidemiological milestones, including the emergence of the HIV epidemic and the launch of major global health initiatives around 2000. The impact of mortality reductions varies by age group, resulting in differing patterns of life expectancy gains and lifespan variation. Reductions in child mortality consistently contributed to increased life expectancy and decreased variation, while mid- and old-age mortality showed contrasting trends. Notably, this study offers a novel contribution by examining the impact of HIV-related mortality shocks on lifespan variation. Our findings challenge conventional frameworks--such as the demographic and epidemiological transitions--in capturing the complexity of mortality change in SSA and call for a reevaluation of prevailing narratives.
Anderson, K.-A. M.; Creanza, N.
Health perceptions and health-related behaviors can change at the population level as cultures evolve. In the last decade, despite the proven efficacy of vaccines, the developed world has seen a resurgence of vaccine-preventable diseases (VPDs) such as measles, pertussis, and polio. Vaccine hesitancy, an individual attitude influenced by historical, political, and socio-cultural forces, is believed to be a primary factor responsible for decreasing vaccine coverage, thereby increasing the risk and occurrence of VPD outbreaks. In recent years, mathematical models of disease dynamics have begun to incorporate aspects of human behavior; however, they do not address how beliefs and motivations influence these health behaviors. Here, using a mathematical modeling framework, we explore the effects of cultural evolution on vaccine hesitancy and vaccination behavior. With this model, we shed light on facets of cultural evolution (vertical and oblique transmission, homophily, etc.) that promote the spread of vaccine hesitancy, ultimately affecting levels of vaccination coverage and VPD outbreak risk in a population. In addition, we present our model as a generalizable framework for exploring cultural evolution when humans' beliefs influence, but do not strictly dictate, their behaviors. This model offers a means of exploring how parents' potentially conflicting beliefs and cultural traits could affect their children's health and fitness. We show that vaccine confidence and vaccine-conferred benefits can both be driving forces of vaccine coverage. We also demonstrate that an assortative preference among vaccine-hesitant individuals can lead to increased vaccine hesitancy and lower vaccine coverage.
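The vertical and oblique transmission routes mentioned above can be sketched as a one-dimensional recursion on the frequency of hesitancy. The functional form and rates below are hypothetical, not the authors' model:

```python
# One-generation recursion for the frequency x of vaccine hesitancy under
# vertical (parent-to-child) and oblique (peer/population) cultural
# transmission. Parameters and functional form are a hypothetical sketch.
def next_hesitancy(x, b_vertical=0.9, b_oblique=0.3):
    retained = x * b_vertical           # vertical: hesitant parents' children stay hesitant
    adopted = (1 - x) * b_oblique * x   # oblique: adoption scales with current frequency
    return retained + adopted

x = 0.2                                 # initial frequency of hesitancy
for _ in range(200):
    x = next_hesitancy(x)
```

This toy recursion settles at a stable interior equilibrium (x = 2/3 for these rates), illustrating how partial vertical retention plus frequency-dependent oblique adoption can sustain hesitancy in a population.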
Visher, E.; Mahjoub, H.; Soufi, K.; Pascual, N.; Hoang, V.; Bartlett, L. J.; Roberts, K. E.; Meaden, S.; Boots, M.
Hosts can often evolve resistance to parasites (and other stressors), but such resistance is generally thought to be constrained by trade-offs with other traits. These trade-offs determine the host's optimal resistance strategy and whether resistance cycles, diversifies, and/or is maintained in the absence of parasites. However, trade-offs are often inconsistently measured across experiments and can depend on environmental conditions. Here, we extend a selection experiment evolving resistance to viral infection under variable resource quality in the Plodia interpunctella model system to explore the evolutionary conditions leading to an incongruent earlier measurement of costless resistance. We find that environmental resource quality, historical contingency, and the time scale of selection all affect trade-offs in our long-term selection experiment. Specifically, populations selected for resistance under the dual stressor of low resource quality were slowed, but not prevented, from evolving resistance. Second, variation in starting populations or early sampled adaptations led to contingency towards context-dependent resistance. Finally, some costs to resistance observed at early time points were compensated over longer evolutionary time scales. Our work therefore informs perspectives on the predictability of adaptation and how variation in specific evolutionary conditions can alter the evolutionary trajectories of a population towards costly or costless resistance strategies.
Castledine, M.; Szczutkowska, Z.; Matthews, A.; Walsh, S. K.; Lewis, R.; Kay, S.; Willment, J. A.; Brown, G. D.; Buckling, A.
Phage therapy, the use of viruses that infect bacteria (bacteriophages), is a promising complement to antibiotics during the antimicrobial resistance crisis, but treatment success is very variable. Evolution of bacterial resistance to bacteriophages and bacteriophage counter-resistance (coevolution) during therapy may explain some of this variation, the dynamics of which may be affected by interactions with the patient's immune system. Here, we examine how a pathogenic bacterium, Pseudomonas aeruginosa, coevolves with two clinically relevant bacteriophages (14-1 and PNM) in the presence of macrophages (RAW 264.7 cell line). We show that macrophages reduced the rate at which bacteria were killed by bacteriophages, likely by reducing bacteria-bacteriophage contact rates. Over evolutionary time-scales, macrophages increased the proportion of bacteriophage-resistant bacteria compared to conditions where macrophages were absent. These differences in resistance rates were likely driven by the early advantage in density offered by macrophages to bacteria, and by the exclusion of PNM from the bacteriophage cocktail, which otherwise increased in frequency in the absence of macrophages. Consequently, macrophages significantly altered the short- and long-term efficacy of a bacteriophage cocktail. In line with a growing body of work, our results suggest that the patient's immune system can reduce the efficacy of phage therapy, potentially driving variable outcomes in therapy success in patients. Significance statement: Phage therapy, the use of viruses that infect bacteria (bacteriophages), is a promising complement to antibiotics during the antimicrobial resistance crisis. However, treatment success is very variable. One often overlooked variable is the immune system: how it influences bacteriophage efficacy, and how bacteria evolve resistance to bacteriophages. We find that macrophages reduce the rate at which bacteria are killed by bacteriophages.
Resistance to bacteriophages also increased in the presence of macrophages, showing that macrophages affect the short- and long-term efficacy of phage therapy. These results highlight the importance of the immune system in phage therapy, and the need for more research in this area.
Wen, F. T.; Malani, A.; Cobey, S.
Although vaccines against seasonal influenza are designed to protect against circulating strains, by affecting the emergence and transmission of antigenically divergent strains they might also change the rate of antigenic evolution. Vaccination might slow antigenic evolution by increasing immunity, reducing the chance that even antigenically diverged strains can survive. Vaccination also reduces prevalence, decreasing the supply of potentially beneficial mutations and increasing the probability of stochastic extinction. But vaccination might accelerate antigenic evolution by increasing the transmission advantage of more antigenically diverged strains relative to less diverged strains (i.e., by positive selection). Such evolutionary effects could affect vaccination's direct benefits to individuals and indirect benefits to the host population (i.e., the private and social benefits). To investigate these potential impacts, we simulated the dynamics of an influenza-like pathogen with seasonal vaccination. On average, more vaccination decreased the rate of viral antigenic evolution and the incidence of disease. Notably, this decrease was driven partly by a vaccine-induced decline in the rate of antigenic evolution. To understand how the evolutionary effects of vaccines might affect their social and private benefits, we fitted linear panel models to simulated data. By slowing evolution, vaccination increased the social benefit and decreased the private benefit. Thus, in the long term, vaccination's potential social and private benefits may differ from current theory, which omits evolutionary effects. These results suggest that conventional seasonal vaccines against influenza, if protective against transmission and given to the appropriate populations, could further reduce disease burden by slowing antigenic evolution.
Vagasi, C. I.; Vincze, O.; Adamkova, M.; Kauzalova, T.; Lendvai, A. Z.; Patras, L.; Penzes, J.; Pap, P. L.; Albrecht, T.; Tomasek, O.
Show abstract
Chronically high blood glucose levels (hyperglycaemia) can compromise healthy ageing and lifespan at the individual level. Elevated oxidative stress can play a central role in hyperglycaemia-induced pathologies. Nevertheless, the lifespan of birds shows no species-level association with blood glucose. This suggests that the potential pathologies of high blood glucose levels can be avoided by adaptations in oxidative physiology at the macroevolutionary scale. However, this hypothesis remains unexplored. Here, we examined this hypothesis using comparative analyses controlled for phylogeny, allometry and fecundity based on data from 51 songbird species (681 individuals with blood glucose and 1021 individuals with oxidative state data). We measured blood glucose at baseline and after stress stimulus and computed glucose stress reactivity as the magnitude of change between the two time points. We also measured three parameters of non-enzymatic antioxidants (uric acid, total antioxidants and glutathione) and a marker of oxidative lipid damage (malondialdehyde). We found no clear evidence for blood glucose concentration being correlated with either antioxidant or lipid damage levels at the macroevolutionary scale, as opposed to the hypothesis postulating that high blood glucose levels entail oxidative costs. The only exception was moderate evidence that species with a stronger stress-induced increase in blood glucose concentration evolved moderately lower investment into antioxidant defence (uric acid and glutathione). Neither baseline nor stress-induced glucose levels were associated with oxidative physiology. Our findings support the hypothesis that birds evolved adaptations preventing the (glyc)oxidative costs of high blood glucose observed at the within-species level. Such adaptations may explain the decoupled evolution of glycaemia and lifespan in birds and possibly the paradoxical combination of long lifespan and high blood glucose levels relative to mammals.
Summary statement: High blood glucose levels can harm organisms by causing oxidative stress. We show that, at the macroevolutionary level, songbirds defy this expectation, as their glucose levels and oxidative physiology are uncoupled.
Owolabi, A. T. Y.; Schneider, P.; Reece, S. E.
Show abstract
Asexually replicating stages of most malaria (Plasmodium spp.) parasite species replicate synchronously within the red blood cells of their vertebrate host. Rhythmicity in this intraerythrocytic developmental cycle (IDC) enables parasites to maximise exploitation of the host and align transmission activities with the time of day that mosquito vectors blood feed. The IDC is also responsible for the major pathologies associated with malaria, and plasticity in the parasite's rhythm can confer tolerance to antimalarial drugs. Both the severity of infection (virulence) and synchrony of the IDC vary across species and between genotypes of Plasmodium, yet this variation is poorly understood. Theory predicts that virulence and IDC synchrony are negatively correlated, and we tested this hypothesis using two closely related genotypes of the rodent malaria model Plasmodium chabaudi that differ markedly in virulence. We also tested the predictions that, in response to perturbations to the timing (phase) of the IDC schedule relative to the phase of host rhythms (misalignment), the virulent parasite genotype recovers the correct phase relationship faster, incurs less fitness loss, and so hosts benefit less from misalignment of the virulent genotype. Our predictions are partially supported; the virulent parasite genotype was less synchronous in some circumstances and recovered faster from misalignment. While hosts were less anaemic when infected by misaligned parasites, the extent of this benefit did not depend on parasite virulence. Overall, our results suggest that interventions to perturb the alignment between the IDC schedule and host rhythms, and to increase synchrony between parasites within each IDC, could alleviate disease symptoms. However, virulent parasites, which are better at withstanding conventional antimalarial treatment, would also be intrinsically better able to tolerate such interventions.
Nino, L. M. J.; Suess, A.; Heinze, J.; Schultner, E.; Oettler, J.
Show abstract
The evolutionary mechanisms that shape aging in social insects are not well understood. It is commonly assumed that queens live long and prosperous, while workers are regarded as a short-lived, disposable caste because of their low reproductive potential. Queens of the ant Cardiocondyla obscurior gain high fitness late in life by increasing investment into sexual offspring as they age. This results in strong selection against senescence until shortly before death. Here, we show that workers have the same lifespan and shape of aging as queens, even though workers lack reproductive organs and cannot gain direct fitness. Considering the prevailing aging theories and the biology of the species, we hypothesize that programmed aging may have evolved under kin selection. Impact statement: Morphologically distinct fertile queen and sterile worker castes in the model ant Cardiocondyla obscurior show the same pace and shape of aging, contradicting the paradigm of queen/worker lifespan divergence in social insects.